
    A Unified Approach to Holomorphic Anomaly Equations and Quantum Spectral Curves

    We present a unified approach to holomorphic anomaly equations and some well-known quantum spectral curves. We develop a formalism of abstract quantum field theory based on the diagrammatics of the Deligne-Mumford moduli spaces $\overline{\mathcal{M}}_{g,n}$ and derive a quadratic recursion relation for the abstract free energies in terms of the edge-cutting operators. This abstract quantum field theory can be realized by various choices of a sequence of holomorphic functions or formal power series and suitable propagators, and the realized quantum field theory can be represented by formal Gaussian integrals. Various applications are given.
    Comment: A section is added

    A unified approach to hadron phenomenology at zero and finite temperatures in a hard-wall AdS/QCD model

    We propose a unified approach to study meson, nucleon and $\Delta$-baryon properties at zero and finite temperatures in the context of the hard-wall AdS/QCD model. We first combine some previous works dealing with mesons and baryons separately, and introduce a new parameter $\xi$ so that the model can give a universal description of the spectrum and couplings of both sectors in a self-consistent way. All observables calculated numerically show reasonable agreement with experimental data. We then study these observables at nonzero temperature by modifying the AdS space-time into AdS-Schwarzschild space-time. Numerically solving the model, we find an interesting temperature dependence of the spectrum and the couplings. We also make a prediction on the finite-temperature decay widths of some nucleon and $\Delta$ excited states.
    Comment: 19 latex pages, 5 figures, final version for publication

    Recurrent Neural Network Training with Dark Knowledge Transfer

    Recurrent neural networks (RNNs), particularly long short-term memory (LSTM) networks, have gained much attention in automatic speech recognition (ASR). Although some success stories have been reported, training RNNs remains highly challenging, especially with limited training data. Recent research found that a well-trained model can be used as a teacher to train other child models, by using the predictions generated by the teacher model as supervision. This knowledge transfer learning has been employed to train simple neural nets with a complex one, so that the final performance can reach a level that is infeasible to obtain by regular training. In this paper, we employ the knowledge transfer learning approach to train RNNs (specifically LSTMs) using a deep neural network (DNN) model as the teacher. This is different from most of the existing research on knowledge transfer learning, since the teacher (DNN) is assumed to be weaker than the child (RNN); however, our experiments on an ASR task showed that it works fairly well: without applying any tricks to the learning scheme, this approach can train RNNs successfully even with limited training data.
    Comment: ICASSP 201
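The teacher-student scheme described in this abstract trains the child model on the teacher's soft predictions rather than hard labels. A minimal sketch of such a distillation loss is below; the temperature parameter `T` and the NumPy-based formulation are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions,
    # exposing more of the teacher's "dark knowledge".
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Cross-entropy between the teacher's softened predictions
    # (used as supervision) and the student's softened predictions.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    return -np.sum(p_teacher * np.log(p_student + 1e-12), axis=-1).mean()
```

In practice this loss (or a weighted mix of it with the hard-label cross-entropy) replaces the usual training objective for the student model; everything else in the training loop stays the same.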